Fixed-Weight Difference Target Propagation

Authors

Abstract

Target Propagation (TP) is a biologically more plausible algorithm than error backpropagation (BP) for training deep networks, and improving the practicality of TP remains an open issue. TP methods require the feedforward and feedback networks to form layer-wise autoencoders for propagating the target values generated at the output layer. However, this causes certain drawbacks; e.g., careful hyperparameter tuning is required to synchronize the feedforward and feedback training, and more frequent updates of the feedback path are usually required than of the feedforward path. Learning the feedback network is sufficient to make TP methods capable of training, but is it a necessary condition for TP to work? We answer this question by presenting Fixed-Weight Difference Target Propagation (FW-DTP), which keeps the feedback weights constant during training. We confirm that this simple method, which naturally resolves the abovementioned problems of TP, can still deliver informative target values to hidden layers for a given task; indeed, FW-DTP consistently achieves higher test performance than its baseline, Difference Target Propagation (DTP), on four classification datasets. We also present a novel propagation architecture that explains the exact form of the feedback function of DTP to analyze FW-DTP. Our code is available at https://github.com/TatsukichiShibuya/Fixed-Weight-Difference-Target-Propagation.
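To make the target-computation step concrete, the following is a minimal NumPy sketch of the difference-target recursion t_l = h_l + g_l(t_{l+1}) − g_l(h_{l+1}), with feedback functions g_l whose weights are drawn once and never updated, as in FW-DTP. The layer sizes, tanh activations, and random-projection form of g_l are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feedforward net: h_{l+1} = tanh(W_l @ h_l), layer sizes 8-16-16-4.
dims = [8, 16, 16, 4]
W = [rng.standard_normal((dims[l + 1], dims[l])) * 0.3 for l in range(3)]

# Feedback weights V_l for g_l: layer l+1 -> layer l.
# In FW-DTP these are drawn once and kept fixed during training.
V = [rng.standard_normal((dims[l], dims[l + 1])) * 0.3 for l in range(3)]

def forward(x):
    """Return all layer activations [h_0, ..., h_L]."""
    hs = [x]
    for Wl in W:
        hs.append(np.tanh(Wl @ hs[-1]))
    return hs

def g(l, h):
    """Feedback function for layer l (a fixed random projection here)."""
    return np.tanh(V[l] @ h)

def propagate_targets(hs, target_L):
    """Difference target propagation: t_l = h_l + g_l(t_{l+1}) - g_l(h_{l+1})."""
    targets = [None] * len(hs)
    targets[-1] = target_L
    for l in range(len(W) - 1, 0, -1):
        targets[l] = hs[l] + g(l, targets[l + 1]) - g(l, hs[l + 1])
    return targets
```

Note the difference correction g_l(t_{l+1}) − g_l(h_{l+1}): when the output target equals the actual output, every hidden target reduces to the activation itself, so imperfect (here: fixed random) feedback does not inject spurious error.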


Similar Articles

Difference Target Propagation

Back-propagation has been the workhorse of recent successes of deep learning but it relies on infinitesimal effects (partial derivatives) in order to perform credit assignment. This could become a serious issue as one considers deeper and more non-linear functions, e.g., consider the extreme case of nonlinearity where the relation between parameters and cost is actually discrete. Inspired by th...

Full text

Fixed Target Energies

We calculated the color-octet contribution to J/ψ hadroproduction at fixed target energies, √s ≃ 40 GeV. We consider J/ψ production with transverse momenta that cannot be explained by the primordial motion of partons, p_T > 1.5 GeV. It is shown that the color-octet contribution is dominant at these energies and reduces the large discrepancies between experimental data and color-singlet model pred...

Full text

Back-Propagation Without Weight Transport

In back-propagation (Rumelhart et al., 1985), connection weights are used both to compute node activations and to compute error gradients for hidden units. Grossberg (1987) has argued that this dual use of the same synaptic connections ("weight transport") constitutes a bidirectional flow of information through synapses, which is biologically implausible. In this paper we formally and empirically demonstrate...

Full text

Fixed Point Solutions of Belief Propagation

Belief propagation (BP) is an iterative method to perform approximate inference on arbitrary graphical models. Whether BP converges and if the solution is a unique fixed point depends on both, the structure and the parametrization of the model. To understand this dependence it is interesting to find all fixed points. In this work, we formulate a set of polynomial equations, the solutions of whi...

Full text

Fourier Finite-Difference Wave Propagation

We introduce a novel technique for seismic wave extrapolation in time. The technique involves cascading a Fourier Transform operator and a finite difference operator to form a chain operator: Fourier Finite Differences (FFD). We derive the FFD operator from a pseudo-analytical solution of the acoustic wave equation. 2-D synthetic examples demonstrate that the FFD operator can have high accuracy...

Full text


Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i8.26171